A Decomposition Method for Weighted Least Squares Low-rank Approximation of Symmetric Matrices

Author

  • JAN DE LEEUW

Abstract

Least squares approximation of a symmetric matrix C by a symmetric positive semidefinite matrix Ĉ of rank at most p is a classical problem. It is typically solved by computing an eigen-decomposition of C and truncating it, keeping only the eigenvectors associated with the min(p, q) largest positive eigenvalues of C, where q is the number of positive eigenvalues of C. Thus if q ≤ p we have rank(Ĉ) = q, and if q ≥ p we have rank(Ĉ) = p.
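The classical eigen-truncation described above can be sketched in a few lines of NumPy. This is a minimal illustration of the standard truncation, not the decomposition method proposed in the paper:

```python
import numpy as np

def truncated_psd_approx(C, p):
    """Least-squares PSD approximation of symmetric C with rank at most p,
    keeping the eigenvectors of the min(p, q) largest positive eigenvalues,
    where q is the number of positive eigenvalues of C."""
    vals, vecs = np.linalg.eigh(C)            # eigenvalues in ascending order
    order = np.argsort(vals)[::-1]            # re-sort in descending order
    vals, vecs = vals[order], vecs[:, order]
    q = int(np.sum(vals > 0))                 # number of positive eigenvalues
    k = min(p, q)                             # rank of the approximation
    return vecs[:, :k] @ np.diag(vals[:k]) @ vecs[:, :k].T
```

For C = diag(3, 1, −2) and p = 2, the approximation keeps both positive eigenvalues and drops the negative one, yielding diag(3, 1, 0); with p larger than q = 2, the rank stays at q.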


Similar resources

Explicit solution of the polynomial least-squares approximation problem on Chebyshev extrema nodes

In this paper we propose an explicit solution to the polynomial least squares approximation problem on Chebyshev extrema nodes. We also show that the inverse of the normal matrix on this set of nodes can be represented as the sum of two symmetric matrices: a full-rank matrix which admits a Cholesky factorization and a rank-2 matrix. Finally we discuss the numerical properties of the proposed fo...
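For context, an ordinary numerical least-squares Chebyshev fit on the extrema nodes x_j = cos(jπ/n) can be set up as below. This sketch solves the normal equations numerically and does not use the paper's explicit formula or its decomposition of the inverse normal matrix:

```python
import numpy as np

def cheb_lsq_fit(f, deg, n):
    """Least-squares fit of degree `deg` in the Chebyshev basis,
    sampled at the n+1 Chebyshev extrema nodes x_j = cos(j*pi/n)."""
    x = np.cos(np.arange(n + 1) * np.pi / n)   # Chebyshev extrema nodes
    y = f(x)
    # chebfit solves the least-squares problem in the Chebyshev basis
    return np.polynomial.chebyshev.chebfit(x, y, deg)
```

A degree-2 fit to f(x) = x² on these nodes reproduces the polynomial exactly, which is a quick sanity check on the node construction.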


On the Global Convergence of the Alternating Least Squares Method for Rank-One Approximation to Generic Tensors

Tensor decomposition has important applications in various disciplines, but it remains an extremely challenging task even today. A slightly more manageable endeavor has been to find a low rank approximation in place of the decomposition. Even for this less stringent undertaking, it is an established fact that tensors beyond matrices can fail to have best low rank approximations, with the...
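A textbook version of the alternating least squares (ALS) iteration for a rank-one approximation of a 3-way tensor can be sketched as follows. This is an illustrative sketch only, not the paper's analysis; the paper concerns the global convergence of such iterations for generic tensors:

```python
import numpy as np

def als_rank_one(T, iters=100, seed=0):
    """Rank-one approximation T ~ lam * a (x) b (x) c of a 3-way tensor
    by alternating least squares: update each factor in turn with the
    other two fixed, then normalize."""
    rng = np.random.default_rng(seed)
    a = rng.standard_normal(T.shape[0]); a /= np.linalg.norm(a)
    b = rng.standard_normal(T.shape[1]); b /= np.linalg.norm(b)
    c = rng.standard_normal(T.shape[2]); c /= np.linalg.norm(c)
    for _ in range(iters):
        a = np.einsum('ijk,j,k->i', T, b, c); a /= np.linalg.norm(a)
        b = np.einsum('ijk,i,k->j', T, a, c); b /= np.linalg.norm(b)
        c = np.einsum('ijk,i,j->k', T, a, b); c /= np.linalg.norm(c)
    lam = np.einsum('ijk,i,j,k->', T, a, b, c)
    return lam, a, b, c
```

On a tensor that is exactly rank one, the iteration recovers the factors (up to sign flips that cancel in the reconstruction).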


Estimating a Few Extreme Singular Values and Vectors for Large-Scale Matrices in Tensor Train Format

We propose new algorithms for singular value decomposition (SVD) of very large-scale matrices based on a low-rank tensor approximation technique called the tensor train (TT) format. The proposed algorithms can compute several dominant singular values and corresponding singular vectors for large-scale structured matrices given in a TT format. The computational complexity of the proposed methods ...


Regularized Low Rank Approximation of Weighted Data Sets

In this paper we propose a fast and accurate method for computing a regularized low rank approximation of a weighted data set. Unlike the non-weighted case, the optimization problem posed to obtain a low rank approximation for weighted data may have local minima. To alleviate the problem with local minima, and consequently to obtain a meaningful solution, we use a priori information about the d...


Regularized Computation of Approximate Pseudoinverse of Large Matrices Using Low-Rank Tensor Train Decompositions

We propose a new method for low-rank approximation of Moore-Penrose pseudoinverses (MPPs) of large-scale matrices using tensor networks. The computed pseudoinverses can be useful for solving or preconditioning large-scale overdetermined or underdetermined systems of linear equations. The computation is performed efficiently and stably based on the modified alternating least squares (MALS) schem...



Journal title:

Volume   Issue

Pages  -

Publication date: 2006